Bridging Few-Shot Learning and Adaptation: New Challenges of Support-Query Shift
Authors
Abstract
Few-Shot Learning (FSL) algorithms have made substantial progress in learning novel concepts with just a handful of labelled data. To classify query instances from classes encountered at test-time, they only require a support set composed of a few samples. FSL benchmarks commonly assume that those queries come from the same distribution as the support set. However, in a realistic setting, data is plausibly subject to change, a situation referred to as Distribution Shift (DS). The present work addresses the new and challenging problem of Few-Shot Learning under Support/Query Shift (FSQS), i.e., when support and query instances are sampled from related but different distributions. Our contributions are the following. First, we release a testbed for FSQS, including datasets, relevant baselines and a protocol for rigorous and reproducible evaluation. Second, we observe that well-established methods unsurprisingly suffer a considerable drop in accuracy when facing FSQS, stressing the significance of our study. Finally, we show that transductive methods can limit the inopportune effect of DS. In particular, we study both the role of Batch-Normalization and Optimal Transport (OT) in aligning distributions, bridging Unsupervised Domain Adaptation with FSL. This results in a method that efficiently combines OT with the celebrated Prototypical Networks. We bring compelling experiments demonstrating the advantage of our method. Our work opens an exciting line of research by providing strong baselines. The code is available at https://github.com/ebennequin/meta-domain-shift.
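The abstract describes combining Optimal Transport with Prototypical Networks to cope with support-query shift. The snippet below is a minimal illustrative sketch of that general idea only, not the authors' published algorithm: the Sinkhorn solver, the cost normalization, the barycentric mapping of query features, and all names and hyper-parameters (reg, n_iters) are assumptions made for this example.

```python
# Illustrative sketch (NumPy): align query features to the support distribution
# with entropy-regularized Optimal Transport, then classify queries by nearest
# class prototype, as in Prototypical Networks. Hyper-parameters are arbitrary.
import numpy as np

def sinkhorn_plan(cost, reg=0.1, n_iters=200):
    """Entropy-regularized OT between uniform marginals (Sinkhorn-Knopp)."""
    n_q, n_s = cost.shape
    a, b = np.full(n_q, 1.0 / n_q), np.full(n_s, 1.0 / n_s)
    K = np.exp(-cost / reg)                      # Gibbs kernel
    u = np.ones(n_q)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]           # transport plan, shape (n_q, n_s)

def transported_prototype_classifier(support_feats, support_labels, query_feats):
    """Map queries onto the support feature cloud, then do nearest-prototype."""
    # Squared Euclidean cost between every query and every support sample,
    # rescaled to [0, 1] to keep the Gibbs kernel numerically stable.
    cost = ((query_feats[:, None, :] - support_feats[None, :, :]) ** 2).sum(-1)
    plan = sinkhorn_plan(cost / cost.max())
    # Barycentric mapping: each query becomes a plan-weighted average of supports.
    mapped = (plan @ support_feats) / plan.sum(axis=1, keepdims=True)
    # Class prototypes = mean support feature per class (Prototypical Networks).
    classes = np.unique(support_labels)
    protos = np.stack([support_feats[support_labels == c].mean(0) for c in classes])
    dists = ((mapped[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return classes[dists.argmin(axis=1)]

# Toy usage: a 2-way, 5-shot episode with features from a hypothetical backbone.
rng = np.random.default_rng(0)
support = rng.normal(size=(10, 64))
labels = np.repeat([0, 1], 5)
queries = rng.normal(loc=0.5, size=(6, 64))      # shifted query distribution
print(transported_prototype_classifier(support, labels, queries))
```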
Similar resources
Few-Shot Adversarial Domain Adaptation
This work provides a framework for addressing the problem of supervised domain adaptation with deep models. The main idea is to exploit adversarial learning to learn an embedded subspace that simultaneously maximizes the confusion between two domains while semantically aligning their embedding. The supervised setting becomes attractive especially when there are only a few target data samples th...
Few-shot Learning
Though deep neural networks have shown great success in the large data domain, they generally perform poorly on few-shot learning tasks, where a classifier has to quickly generalize after seeing very few examples from each class. The general belief is that gradient-based optimization in high capacity classifiers requires many iterative steps over many examples to perform well. Here, we propose ...
One-shot and few-shot learning of word embeddings
Standard deep learning systems require thousands or millions of examples to learn a concept, and cannot integrate new concepts easily. By contrast, humans have an incredible ability to do one-shot or few-shot learning. For instance, from just hearing a word used in a sentence, humans can infer a great deal about it, by leveraging what the syntax and semantics of the surrounding words tell us. ...
On the comparison of keyword and semantic-context methods of learning new vocabulary meaning
The rationale behind the present study is that particular learning strategies produce more effective results when applied together. The present study tried to investigate the efficiency of the semantic-context strategy along with a technique called the keyword method. To clarify the point, the current study sought to find an answer to the following question: are the keyword and semantic-context metho...
Few-Shot Learning with Graph Neural Networks
We propose to study the problem of few-shot learning with the prism of inference on a partially observed graphical model, constructed from a collection of input images whose label can be either observed or not. By assimilating generic message-passing inference algorithms with their neural-network counterparts, we define a graph neural network architecture that generalizes several of the recentl...
Journal
Journal title: Lecture Notes in Computer Science
Year: 2021
ISSN: 1611-3349, 0302-9743
DOI: https://doi.org/10.1007/978-3-030-86486-6_34